library(tidyverse)
library(printr)

require(readxl)
require(jpeg)
require(ggimg)

Input and preparation

Yet Zero writes its output in CSV format, containing metadata and measured gaze coordinates; one CSV file is created per session. During preparation we read in all CSV files, combine them into a single data frame, and add external metadata on stimuli and areas of interest (AOI).
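For orientation, a single session file might look like this. The column names are taken from the read code further below; the values shown here are invented for illustration:

```csv
Exp,Part,Stim,time,x,y,x_pro,y_pro
DemoMS,1,92.jpg,0.000,221.6,216.2,0.246,0.240
DemoMS,1,92.jpg,0.112,233.5,217.3,0.259,0.241
```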

For this process, the following standard files and folders are used:

  1. A folder “Data/” containing all CSV files produced by Yet Zero
  2. A folder “Stimuli/” containing all stimulus images
  3. A file “Stimuli/Stimuli.csv” containing meta data on the stimuli
  4. A file “Stimuli/AOI.csv” containing definitions of areas of interest (AOI) on the stimuli

We begin with reading in the stimulus data and showing how it can be put into static ggplot figures. This is particularly useful for checking the AOI definitions visually.

Reading stimulus metadata

The following function reads in the stimulus metadata from the specified CSV file and adds the image paths. The col_types = cols(...) argument specifies the expected data type of each column in the CSV file. Always remember that raw CSV data is plain text, including the numbers. read_csv usually guesses the data types correctly, but there are edge cases where it fails, so it is safer to specify them explicitly.

col_types_Stim <- cols(File = col_character(),
                       width = col_double(),
                       height = col_double(),
                       hum_like = col_double(),
                       hum_skull = col_logical(),
                       hum_eye = col_logical(),
                       Face = col_character())



read_stim <- 
  function(file = "Stimuli/Stimuli.csv", 
           stim_dir = "Stimuli/",
           col_types = col_types_Stim
           ){
    stimuli <-
      read_csv(file,
               col_types = col_types) %>% 
      mutate(Path = str_c(stim_dir, File, sep = ""))
    return(stimuli)    
  }

read_stim()
File width height hum_like hum_skull hum_eye Face Path
18.jpg 450 450 38 FALSE TRUE 18 Stimuli/18.jpg
27manipulated2.jpg 450 450 39 FALSE FALSE 27 Stimuli/27manipulated2.jpg
29.jpg 450 450 75 FALSE TRUE 29 Stimuli/29.jpg
34manipulated3.jpg 450 450 72 FALSE FALSE 34 Stimuli/34manipulated3.jpg
35.jpg 450 450 75 FALSE TRUE 35 Stimuli/35.jpg
37manipulated2.jpg 450 450 75 FALSE FALSE 37 Stimuli/37manipulated2.jpg
18manipulated.jpg 450 450 38 FALSE FALSE 18 Stimuli/18manipulated.jpg
75.jpg 450 450 40 FALSE TRUE 75 Stimuli/75.jpg
29manipulated4.jpg 450 450 75 FALSE FALSE 29 Stimuli/29manipulated4.jpg
34.jpg 450 450 72 FALSE TRUE 34 Stimuli/34.jpg
27.jpg 450 450 39 FALSE FALSE 27 Stimuli/27.jpg
95manipulated3.jpg 450 450 93 TRUE FALSE 95 Stimuli/95manipulated3.jpg
92.jpg 450 450 87 TRUE TRUE 92 Stimuli/92.jpg
75manipulated2.jpg 450 450 40 FALSE FALSE 75 Stimuli/75manipulated2.jpg
35manipulated3.jpg 450 450 75 FALSE FALSE 35 Stimuli/35manipulated3.jpg
37.jpg 450 450 75 FALSE TRUE 37 Stimuli/37.jpg
95.jpg 450 450 93 TRUE TRUE 95 Stimuli/95.jpg
92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg

Here we use read_stim to create the Stimuli table and do a little recoding where needed. The file name acts as the identifier for the stimuli.

Stimuli <- read_stim() %>% 
      mutate(Skull = if_else( hum_skull, "human", "ape"),
             Sclera = if_else( hum_eye, "human", "ape"),
             Stim = File)

Making a stimulus grid

A useful way to visualize eye tracking data is to use a grid of the stimuli (like a thumbnail sheet) and overlay the eye tracking data on top of it. This can take different forms, but the underlying grid is always the same. Here we create a re-usable ggplot object G_0 containing a grid of pictures.

G_0 <- 
  Stimuli %>% 
    ggplot(aes(xmin = 0, xmax = width, 
                        ymin = 0, ymax = height)) +
    facet_wrap(~Stim) +
    ggimg::geom_rect_img(aes(img = Path))
G_0

Reading CSV files

The following code defines two functions. The first function, read_yz_csv, reads in a single CSV file and removes duplicate rows (which can occur when the eye tracker loses track of the eye). The second function, read_yz_data, takes a list of file names, reads them, combines them into a single data frame, and adds stimulus metadata from the stimulus table created above.

**Notes**

The map_df() function is a so-called iterator. Iterators are higher-order functions: one of their arguments is itself a function. In R, iterators are the idiomatic way to apply an operation repeatedly over a data set, which is why tidyverse code rarely uses explicit for loops, as one would in Python. This is also one reason why R is called a functional programming language!
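A minimal illustration of this pattern, using a made-up reader function instead of the real CSV files:

```r
library(purrr)
library(tibble)

# Pretend "reader": turns one file name into a one-row data frame
fake_read <- function(file) tibble(File = file, n_chars = nchar(file))

# map_df() applies fake_read to every element and row-binds the results
map_df(c("a.csv", "bb.csv"), fake_read)
```

This has the same shape as the real call inside read_yz_data: map_df(read_yz_csv).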

There is another famous piece of data science at work here. Think about the problem of adding the Stimulus meta data to the eye tracking data. Try to picture how you would do it manually:

  1. Identify matching Stimulus ID in both tables
  2. Copy the meta data from the Stimulus table to the eye tracking data table
  3. Repeat for all rows in the eye tracking data table.

This operation is called a join in data science. A precondition for joins is that the tables have matching identifiers to establish the relationship; here, that is the Stim column.
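A minimal sketch of those three manual steps with left_join, on toy tables (names and values invented for illustration):

```r
library(dplyr)
library(tibble)

measures <- tibble(Stim = c("a.jpg", "a.jpg", "b.jpg"), x = c(10, 20, 30))
meta     <- tibble(Stim = c("a.jpg", "b.jpg"), width = c(450, 450))

# Each row of measures receives the metadata of its matching Stim
left_join(measures, meta, by = "Stim")
```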

read_yz_csv <- function(file){
  read_csv(file, 
           col_types = cols(Exp = col_character(), Part = col_character(), 
                            Stim = col_character(), time = col_double(), 
                            x = col_double(), y = col_double(), 
                            x_pro = col_double(), y_pro = col_double())) %>% 
    mutate(is_duplicate = x == lag(x) & y == lag(y)) %>% 
    filter(is.na(is_duplicate) | !is_duplicate) %>% # lag() is NA on the first row; keep that row
    select(Exp, Part, Stim, time, x, y, x_pro, y_pro)
}

read_yz_data <- function(files, stim_tab){
  Data <- 
    files %>% 
    map_df(read_yz_csv) %>% 
    mutate(Obs  = row_number()) %>%
    mutate(Part = as.factor(as.integer(Part) - min(as.integer(Part)) + 1)) %>% ## shift the Part identifiers so the smallest becomes 1
    group_by(Part) %>%
    mutate(time = time - min(time)) %>% # time since start experiment
    ungroup() %>%
    left_join(stim_tab, by = "Stim") %>% 
    mutate(y = height - y, # translating from origin at top (pygame) to bottom (ggplot)
           y_pro = 1 - y_pro)
  return(Data)
}

Before we can use this function, we have to create a list of all CSV files we want to process. The default shown here reads all CSV files from the Data/ directory.

csv_files <- dir(path = "Data/",
                 pattern = "\\.csv$", # a regular expression, not a glob
                 recursive = FALSE,
                 full.names = TRUE)

D_0 <- read_yz_data(csv_files, Stimuli)

head(D_0)
Exp Part Stim time x y x_pro y_pro Obs File width height hum_like hum_skull hum_eye Face Path Skull Sclera
DemoMS 1 92manipulated2.jpg 0.0000000 221.6066 233.7724 0.2462295 0.7597471 1 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape
DemoMS 1 92manipulated2.jpg 0.1123621 233.4877 232.6674 0.2594308 0.7585194 2 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape
DemoMS 1 92manipulated2.jpg 0.2063122 235.8665 231.6465 0.2620739 0.7573850 3 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape
DemoMS 1 92manipulated2.jpg 0.3031878 235.8662 234.0086 0.2620735 0.7600096 4 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape
DemoMS 1 92manipulated2.jpg 0.3999944 238.4736 237.5755 0.2649706 0.7639728 5 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape
DemoMS 1 92manipulated2.jpg 0.5125248 229.3837 247.0309 0.2548707 0.7744787 6 92manipulated2.jpg 450 450 87 TRUE FALSE 92 Stimuli/92manipulated2.jpg human ape

Last Session

Before continuing with the AOI coding, we take a look at the data of the last participant only. This is a useful check right after running an experiment.

Here we also introduce some basic visualizations of the raw data as overlays on the stimulus grid. The following function extracts the data of the last participant.

get_last_part <- function(data){
  last_part <- 
    distinct(data, Part) %>% 
    filter(as.numeric(Part) == max(as.numeric(Part), na.rm = T)) %>% 
    left_join(data, by = "Part")
  return(last_part)
}
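Under the assumption that there are no NA participant rows that must be preserved, the same extraction can be sketched as a single filter (get_last_part2 is a hypothetical name, not used elsewhere):

```r
library(dplyr)

# Equivalent sketch: keep only the rows of the highest participant number
get_last_part2 <- function(data) {
  filter(data, as.numeric(Part) == max(as.numeric(Part), na.rm = TRUE))
}
```

For typical table sizes the two versions are interchangeable; the join-based version above merely separates finding the participant from retrieving their rows.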

Here we extend the stimulus grid G_0 with points of measurement. See how the geom_point function takes a data argument, which allows us to specify a different data frame than the one used for the base ggplot object G_0 (which is the table of stimuli).

Note that the individual panels are rescaled to accommodate the eye tracking measurements, which can fall outside the image area.

G_0 +
  geom_point(aes(x = x, y = y),
             size = 2,
             col = "red",
             alpha = .2,
             data = get_last_part(D_0),
             inherit.aes = F)

In some studies, eye tracking is used to test theories that predict a certain gaze sequence, such as: first look at the eyes, then at the mouth, then back to the eyes. The following variation shows the scan path of the last participant by connecting consecutive measurements with lines.

G_0 +
  geom_point(aes(x = x, y = y,
                 col = Part),
             data = get_last_part(D_0),
             size = .1,
             inherit.aes = F) +
  geom_path(aes(x = x, 
                y = y,
                group = Part),
            col = "red",
            inherit.aes = F,
            data = get_last_part(D_0)) # geom_path connects points in data (time) order

When we joined in the stimulus metadata, we also imported all the variables this table specifies. The following code compares the eye tracking data between two stimulus conditions: human (white) versus ape (dark) scleras.

G_0 +
  geom_point(aes(x = x, 
                 y = y,
                 col = Sclera),
             size = 2,
             alpha = .2,
             data = get_last_part(D_0),
             inherit.aes = F)

Finally, the following plot shows how data from all participants can be visualized, by coloring by participant identifier.

G_0 +
  geom_point(aes(x = x, y = y,
                 col = Part), # <--
             size = 2,
             data = D_0, # <--
             inherit.aes = F) +
  facet_wrap(~Stim)

Deriving measures

Often it is relevant to know where someone is looking, but not always. Here we derive a simple measure: the distance the eyes travelled between consecutive measurements during stimulus presentation.

add_travel <- 
  function(data) 
    mutate(data,  travel = sqrt((x - lag(x))^2 + (y - lag(y))^2))
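A quick sanity check of the travel formula on toy coordinates (values invented; a 3-4-5 triangle):

```r
library(dplyr)

tibble::tibble(x = c(0, 3, 3), y = c(0, 4, 4)) %>%
  mutate(travel = sqrt((x - lag(x))^2 + (y - lag(y))^2))
# travel: NA (no predecessor), 5, 0
```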

D_1 <- 
  D_0 %>% 
  mutate(Sequence = as.factor(str_c(Part, Stim, sep = "_"))) %>% 
  group_by(Sequence) %>% 
  add_travel() %>% 
  ungroup() %>% 
  select(Obs, Part, Stim, Face, Sequence, hum_like, 
         Sclera, Skull, time, x, y, travel)
D_1 %>% 
  ggplot(aes(x = travel)) +
  geom_histogram() +
  facet_wrap(~Stim)
## `stat_bin()` using `bins = 30`. Pick better value with `binwidth`.
## Warning: Removed 42 rows containing non-finite outside the scale range
## (`stat_bin()`).

Areas of interest

The main use case for eye tracking is to measure how often, how long, or in which sequence people look at certain areas of interest (AOI) on the stimulus. What these areas are differs from case to case; here they are the main regions of a face: eyes, mouth and nose.

AOIs are defined as rectangles on the stimulus images. The following function reads the AOI definitions from a CSV file and merges them with the stimulus metadata to obtain absolute coordinates for each AOI on each stimulus.

Reading AOI

The function defined here again makes use of a join operation, which merges the stimulus metadata with the AOI rectangle definitions. Unlike the previous join, the linking identifier is not the stimulus ID but the face: in this experiment, the same face appeared in two conditions differing in sclera color. This is also the reason why the join is many-to-many.

read_yz_aoi <- 
  function(file = "Stimuli/AOI.csv",
           stim_dir = "Stimuli/",
           col_types = cols(AOI = col_character(), Face = col_character(), 
                            x = col_double(), y = col_double(), 
                            w = col_double(), h = col_double()),
           stim_tab) {
    
    read_csv(file, col_types = col_types) %>% 
      rename(x_aoi = x, y_aoi = y, w_aoi = w, h_aoi = h) %>% 
      right_join(stim_tab, by = "Face", relationship = "many-to-many") %>% 
      mutate(xmin = x_aoi, 
             xmax = x_aoi + w_aoi,
             ymax = height - y_aoi, ## reversing the y coordinates
             ymin = (height - y_aoi) - h_aoi) %>% 
      arrange(Face, AOI)
  }
AOI <- read_yz_aoi(stim_tab = Stimuli)
head(AOI)
Face AOI x_aoi y_aoi w_aoi h_aoi File width height hum_like hum_skull hum_eye Path Skull Sclera Stim xmin xmax ymax ymin
18 eyes 130 90 110 50 18.jpg 450 450 38 FALSE TRUE Stimuli/18.jpg ape human 18.jpg 130 240 360 310
18 eyes 130 90 110 50 18manipulated.jpg 450 450 38 FALSE FALSE Stimuli/18manipulated.jpg ape ape 18manipulated.jpg 130 240 360 310
18 mouth 110 225 135 35 18.jpg 450 450 38 FALSE TRUE Stimuli/18.jpg ape human 18.jpg 110 245 225 190
18 mouth 110 225 135 35 18manipulated.jpg 450 450 38 FALSE FALSE Stimuli/18manipulated.jpg ape ape 18manipulated.jpg 110 245 225 190
18 nose 145 150 75 50 18.jpg 450 450 38 FALSE TRUE Stimuli/18.jpg ape human 18.jpg 145 220 300 250
18 nose 145 150 75 50 18manipulated.jpg 450 450 38 FALSE FALSE Stimuli/18manipulated.jpg ape ape 18manipulated.jpg 145 220 300 250

AOI preview

As we did before with the stimulus images, we can now create a ggplot object that shows the AOI rectangles on top of the stimulus images. This is very useful for checking that the AOI definitions are correct.

G_1 <- 
  AOI %>% 
  ggplot(aes(xmin = 0, xmax = width, 
             ymin = 0, ymax = height)) +
  facet_wrap(~Face) + # <--
  ggimg::geom_rect_img(aes(img = Path)) +
  geom_rect(aes(xmin = xmin, ymin = ymin, 
                xmax = xmax, ymax = ymax,
                fill = AOI),
            alpha = .2, 
            inherit.aes  = F)

G_1

AOI Classification

We have the AOI definitions; now we can classify each eye tracking observation by the AOI it falls into. For this purpose, we do a many-to-many join between the eye tracking data and the AOI definitions, based on the Face identifier. With three AOIs, this turns each observation into three rows.

This is just an intermediate step. Next, we check for each observation-AOI pair whether the measured position falls into the AOI rectangle and keep only those rows. As long as the AOIs are non-overlapping, there can be at most one match (or none) per observation. Finally, we right-join this result with the eye tracking data to retain all observations, and assign "Outside" to those that did not fall into any AOI.

D_2 <- 
  D_0 %>% 
  left_join(AOI, by = "Face", relationship = "many-to-many") %>% 
  mutate(is_in = x > xmin & x < xmax & y > ymin & y < ymax) %>% 
  filter(is_in) %>% 
  select(Obs, AOI) %>% 
  right_join(D_1, by = "Obs") %>% 
  mutate(AOI = if_else(is.na(AOI), "Outside", AOI)) %>% 
  arrange(Part, time)
D_2 %>% 
  group_by(AOI, Sclera, Skull) %>% 
  summarize(n = n()) %>% 
  ungroup() %>% 
  ggplot(aes(y = n, x = AOI, fill = AOI)) +
  facet_grid(Skull~Sclera) +
  geom_col()
## `summarise()` has grouped output by 'AOI', 'Sclera'. You can override using the
## `.groups` argument.

G_0 +
  geom_count(aes(x = x, y = y, 
                 col = AOI),
             alpha = .5,
             inherit.aes  = F,
             data = D_2)

Measuring visits

A visit is a contiguous sequence of eye positions within the same region. The following code uses a combined criterion for starting a new visit:

  • the position falls into a different AOI than the previous one,

  • OR: the distance travelled from the previous position exceeds a certain threshold.

travel_threshold <- 50

D_3 <-
  D_2 %>%
  group_by(Part, Stim) %>%
  filter(AOI != lag(AOI) | travel > travel_threshold) %>% ## logical OR
  mutate(visit = row_number(),
         duration = lead(time) - time) %>%
  ungroup()

sample_n(D_3, 10)
Obs AOI Part Stim Face Sequence hum_like Sclera Skull time x y travel visit duration
869 Outside 2330 18manipulated.jpg 18 2330_18manipulated.jpg 38 ape ape 42.286823 167.1610 227.4478 50.34718 6 0.3058212
641 eyes 2330 95manipulated3.jpg 95 2330_95manipulated3.jpg 93 ape human 10.192777 186.7610 252.9887 56.68922 2 0.0000000
1415 Outside 27832039 35.jpg 35 27832039_35.jpg 75 human ape 13.799940 312.7428 -224.2512 76.23570 6 0.0999587
711 eyes 2330 29.jpg 29 2330_29.jpg 75 human ape 20.193092 246.5988 266.3204 59.34337 5 0.0000000
207 eyes 1 35manipulated3.jpg 35 1_35manipulated3.jpg 75 ape ape 26.207246 169.8247 282.9479 62.18898 8 0.3041506
1225 Outside 27831026 75manipulated2.jpg 75 27831026_75manipulated2.jpg 40 ape ape 3.876154 190.1445 234.7416 12634.37036 1 0.0318010
832 eyes 2330 34manipulated3.jpg 34 2330_34manipulated3.jpg 72 ape ape 37.486790 250.2501 267.1694 55.01961 5 0.0000000
807 nose 2330 37.jpg 37 2330_37.jpg 75 human ape 34.096725 182.7439 205.5361 53.30248 12 0.7996616
322 eyes 1 18.jpg 18 1_18.jpg 38 human ape 41.407312 173.8425 331.5005 36.66676 3 NA
1303 Outside 27831026 75manipulated2.jpg 75 27831026_75manipulated2.jpg 40 ape ape 6.480261 50.8307 -540.2919 95.56319 79 0.0360994

Plotting visit paths and duration

G_3 <-
  G_0 +
  geom_point(aes(x = x, y = y,
                 size = duration), # <--
             color = "white",
             alpha = .2,
             inherit.aes  = F,
             data = D_3)

G_3
## Warning: Removed 42 rows containing missing values or values outside the scale range
## (`geom_point()`).

G_4 <-
  G_0 +
  geom_path(aes(x = x, y = y,
                col = Part),
            inherit.aes  = F,
            data = D_3) # <--

G_4

Participant-level analysis

Frequencies and durations

D_4 <-
  D_3 %>%
  group_by(Part, Face, AOI, Sclera, Skull) %>%  # <--
  summarize(n_visits = n(),
            total_dur = sum(duration, na.rm = TRUE)) %>%
  ungroup() %>% 
  mutate(congruent = (Sclera == Skull))
## `summarise()` has grouped output by 'Part', 'Face', 'AOI', 'Sclera'. You can
## override using the `.groups` argument.
D_4
Part Face AOI Sclera Skull n_visits total_dur congruent
1 18 Outside ape ape 2 0.2079959 TRUE
1 18 eyes ape ape 5 1.2003860 TRUE
1 18 eyes human ape 1 0.0000000 FALSE
1 18 mouth ape ape 1 0.4952917 TRUE
1 18 nose human ape 2 0.2080021 FALSE
1 27 Outside ape ape 9 1.6818917 TRUE
1 27 eyes ape ape 25 3.5174246 TRUE
1 29 Outside ape ape 2 0.3033233 TRUE
1 29 Outside human ape 3 0.7841923 FALSE
1 29 eyes ape ape 4 2.4987192 TRUE
1 29 eyes human ape 8 2.2080870 FALSE
1 29 nose ape ape 2 0.2067378 TRUE
1 34 Outside ape ape 8 2.1751077 TRUE
1 34 eyes ape ape 5 0.3197100 TRUE
1 34 eyes human ape 10 1.4085233 FALSE
1 35 Outside ape ape 1 0.0960505 TRUE
1 35 eyes ape ape 10 2.3993502 TRUE
1 35 eyes human ape 7 1.6006207 FALSE
1 35 nose ape ape 1 0.1923461 TRUE
1 37 eyes ape ape 17 2.3042741 TRUE
1 37 eyes human ape 17 2.7040896 FALSE
1 37 nose ape ape 1 0.0960007 TRUE
1 75 Outside ape ape 1 0.0968530 TRUE
1 75 Outside human ape 1 0.1920197 FALSE
1 75 eyes ape ape 3 1.5033076 TRUE
1 75 eyes human ape 6 2.5120931 FALSE
1 92 Outside ape human 2 0.4000144 FALSE
1 92 eyes ape human 4 1.6953952 FALSE
1 92 eyes human human 12 2.4005816 TRUE
1 92 mouth ape human 1 0.3041422 FALSE
1 95 eyes ape human 9 2.7992332 FALSE
1 95 eyes human human 11 1.8887534 TRUE
2330 18 Outside ape ape 5 0.8829131 TRUE
2330 18 Outside human ape 5 1.0231845 FALSE
2330 18 eyes human ape 9 0.6903307 FALSE
2330 18 mouth ape ape 3 0.3351448 TRUE
2330 18 mouth human ape 5 0.3996296 FALSE
2330 18 nose ape ape 5 1.7907665 TRUE
2330 18 nose human ape 5 0.4950399 FALSE
2330 27 Outside ape ape 8 1.9990637 TRUE
2330 27 eyes ape ape 9 0.7857876 TRUE
2330 27 mouth ape ape 4 0.7980435 TRUE
2330 27 nose ape ape 6 1.0093987 TRUE
2330 29 Outside ape ape 6 1.7119875 TRUE
2330 29 Outside human ape 1 0.4953139 FALSE
2330 29 eyes ape ape 6 0.5917473 TRUE
2330 29 eyes human ape 9 2.4189792 FALSE
2330 29 nose ape ape 2 0.4957144 TRUE
2330 29 nose human ape 3 0.0932391 FALSE
2330 34 Outside human ape 6 1.1994767 FALSE
2330 34 eyes ape ape 10 2.1098831 TRUE
2330 34 mouth human ape 5 0.8007209 FALSE
2330 34 nose human ape 1 0.6075718 FALSE
2330 35 Outside ape ape 9 1.9029665 TRUE
2330 35 Outside human ape 3 0.6871297 FALSE
2330 35 eyes ape ape 2 0.2886381 TRUE
2330 35 eyes human ape 1 0.1912990 FALSE
2330 35 mouth ape ape 2 0.2084906 TRUE
2330 35 mouth human ape 2 0.2075734 FALSE
2330 35 nose ape ape 1 0.3043332 TRUE
2330 35 nose human ape 11 1.6014738 FALSE
2330 37 eyes ape ape 11 2.3037431 TRUE
2330 37 eyes human ape 1 0.1917040 FALSE
2330 37 mouth human ape 5 0.8001504 FALSE
2330 37 nose human ape 9 1.8075886 FALSE
2330 75 Outside ape ape 5 0.4004493 TRUE
2330 75 Outside human ape 5 1.3131621 FALSE
2330 75 eyes ape ape 2 0.1919389 TRUE
2330 75 mouth ape ape 6 1.0083680 TRUE
2330 75 mouth human ape 3 0.3999889 FALSE
2330 75 nose ape ape 4 1.2951844 TRUE
2330 75 nose human ape 3 0.2885468 FALSE
2330 92 Outside human human 6 0.7034686 TRUE
2330 92 eyes ape human 11 2.5920970 FALSE
2330 92 eyes human human 2 0.1930888 TRUE
2330 92 mouth human human 2 0.2069781 TRUE
2330 92 nose human human 6 1.5053504 TRUE
2330 95 Outside human human 4 0.3856497 TRUE
2330 95 eyes ape human 11 2.3999085 FALSE
2330 95 eyes human human 12 2.5101209 TRUE
27831026 27 Outside ape ape 80 2.9079714 TRUE
27831026 75 Outside ape ape 74 2.6678762 TRUE
27831026 75 eyes ape ape 7 0.1357682 TRUE
27831026 75 mouth ape ape 2 0.0360231 TRUE
27831026 75 nose ape ape 6 0.1002169 TRUE
27832039 18 Outside ape ape 20 2.8999631 TRUE
27832039 27 Outside ape ape 8 2.7999856 TRUE
27832039 34 Outside ape ape 26 2.8999107 TRUE
27832039 35 Outside human ape 27 2.8000491 FALSE

G_6 <-
  D_4 %>%
  ggplot(aes(x = congruent, y = n_visits, fill = AOI)) +
  facet_wrap(~Part) +
  geom_col()

G_6

G_7 <-
  D_4 %>%
  ggplot(aes(x = AOI, y = total_dur, fill = congruent)) +
  facet_wrap(~Part) +
  geom_col()

G_7